
    Selection and Demography Drive Range-Wide Patterns of MHC Variation in Mule Deer (Odocoileus hemionus)

    Variation at functional genes involved in immune response is of increasing concern as wildlife diseases continue to emerge and threaten populations. The amount of standing genetic variation in a population is directly associated with its potential for rapid adaptation to novel environments. For genes in the major histocompatibility complex (MHC), which are crucial in activating the immune response and which have extremely high levels of polymorphism, genetic variation has been shown to be influenced by both parasite-mediated selection and historical population demography. To better understand the relative roles of parasite-mediated selection and demography in MHC evolution in large populations, I analyzed geographic patterns of variation at the MHC DRB class II locus in mule deer (Odocoileus hemionus). I identified 31 new MHC-DRB alleles that were phylogenetically similar to other cervid MHC alleles, and I found one allele that was shared with white-tailed deer (Odocoileus virginianus). I found evidence for selection on the MHC based on high dN/dS ratios, positive neutrality tests, deviations from Hardy-Weinberg equilibrium (HWE) and greater isolation-by-distance (IBD) than expected under neutrality. However, I also saw evidence that historical demography is important in shaping variation at the MHC, in the similar structures of variation between the MHC and microsatellites and the lack of significant environmental drivers of variation at either locus. These results show that both natural selection and historical demography are important drivers of MHC evolution in mule deer and may aid in predicting how future adaptation is shaped when this species is confronted with environmental challenges.
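    As a rough illustration of one of the selection signals mentioned above (departure from Hardy-Weinberg equilibrium), the Python sketch below runs a chi-square HWE test simplified to a single biallelic locus; the MHC-DRB locus in the study is far more polymorphic, and the genotype counts and function name here are hypothetical, not the study's analysis.

```python
# Minimal sketch, assuming a single biallelic locus with genotype counts
# n_AA, n_Aa and n_aa; the counts below are invented placeholders.
from scipy.stats import chi2

def hwe_chi_square(n_AA, n_Aa, n_aa):
    """Chi-square test for departure from Hardy-Weinberg equilibrium."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)            # frequency of allele A
    q = 1 - p
    expected = [p * p * n, 2 * p * q * n, q * q * n]
    observed = [n_AA, n_Aa, n_aa]
    stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    p_value = chi2.sf(stat, df=1)              # 3 classes - 1 - 1 estimated allele frequency
    return stat, p_value

print(hwe_chi_square(50, 30, 20))              # hypothetical genotype counts
```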

    "Any lady can do this without much trouble ...": class and gender in The dining room (1878)

    Macmillan's "Art at Home" series (1876–83) was a collection of domestic advice manuals. Mentioned in every study of the late-nineteenth-century domestic interior, they have often been interpreted, alongside contemporary publications such as Charles Eastlake's Hints on Household Taste (1868), as indicators of late 1870s home furnishing styles. Mrs Loftie's The Dining Room (1878) was the series' fifth book, and it considers one of the home's principal (and traditionally masculine) domestic spaces. Recent research on middle-class cultural practices surrounding food has placed The Dining Room within the tradition of Mrs Beeton's Household Management (1861); however, it is not a cookery book and hardly mentions dinners. Drawing upon unpublished archival sources, this paper charts the production and reception of The Dining Room, aiming to unravel its relationships with other contemporary texts and to highlight the difficulties of using it as historical evidence. While it offers fascinating insights into contemporary taste, class and gender, this paper suggests that, as an example of domestic design advice literature, it reveals far more about the often expedient world of nineteenth-century publishing practices.

    Evaluation of headache severity after aneurysmal subarachnoid hemorrhage

    Objective: The most common complaint from patients after subarachnoid hemorrhage (SAH) is headache. The headache appears to be persistent and often severe. Although this problem is pervasive in the care of SAH patients, very little data have been published describing the nature and severity of the headache, nor is there evidence-based guidance on the appropriate treatment of headache due to SAH. Methods: This was a retrospective medical record review of adults with aneurysmal SAH. Basic demographics, along with pain scores and analgesic medication administration, were collected. Patients with early vasospasm (within 7 days of ictus) were compared with patients with no vasospasm. Results: The patient population was characteristic of the typical SAH population. Approximately 31.5% of patients exhibited early vasospasm. These patients had higher pain scores (median 8/10) than did patients without vasospasm (median 6/10) and required more analgesics such as acetaminophen/butalbital/caffeine. Treatment success with any analgesic used in this population was minimal. The pain scores associated with headache increased over the first 7 days in both groups. Conclusions: Headache after SAH is persistent and treatment refractory. There may be an association between the development of vasospasm and worsening of headache. Novel treatment strategies to attenuate headache in this population are needed.
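    As an illustration of the kind of group comparison described above, the sketch below applies a Mann-Whitney U test to 0-10 pain scores from a vasospasm group and a no-vasospasm group; this is not the authors' analysis, and the scores are invented placeholders.

```python
# Minimal sketch: nonparametric comparison of ordinal pain scores
# between two groups; all numbers are hypothetical, not study data.
from scipy.stats import mannwhitneyu

vasospasm_scores = [8, 7, 9, 8, 6, 8, 7]       # hypothetical 0-10 pain scores
no_vasospasm_scores = [6, 5, 7, 6, 4, 6, 5]    # hypothetical 0-10 pain scores

stat, p_value = mannwhitneyu(vasospasm_scores, no_vasospasm_scores,
                             alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")
```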

    Early reconstitution of effector memory CD4+ CMV-specific T cells protects against CMV reactivation following allogeneic SCT.

    Reactivation of CMV is a common complication following allogeneic haematopoietic SCT and is associated with significant morbidity and mortality. The relative importance of the CD4+ and CD8+ components of the CMV-specific immune response in protection from reactivation is unclear. The CMV-specific CD4+ and CD8+ immune response was measured at serial time points in 32 patients following allogeneic HSCT. Intracellular cytokine staining following CMV lysate stimulation and HLA-peptide tetramers were used to determine CMV-specific CD4+ and CD8+ responses, respectively. A deficient CMV-specific CD4+ T-cell immune response within the first 30-50 days post transplant was associated with a high risk of viral reactivation. Patients with combined impairment of the CD4+ and CD8+ immune response within the first 100 days were susceptible to late viral reactivation. The frequency of CMV-specific CD4+ T cells correlated with that of CMV-specific CD8+ T cells, comprising 10% of the whole T-cell repertoire. Early CMV-specific CD4+ T-cell reconstitution was dominated by effector memory cells, with normal levels of IL-2 resuming 6 months following transplantation. In summary, both CD4 and CD8 CMV-specific immune reconstitution is required for protection from recurrent reactivation. Measurement of the magnitude of the CMV-specific CD4+ immune response is useful in managing viral reactivation following HSCT.

    Estimation of Dietary Iron Bioavailability from Food Iron Intake and Iron Status

    Currently there are no satisfactory methods for estimating dietary iron absorption (bioavailability) at a population level, but this is essential for deriving dietary reference values using the factorial approach. The aim of this work was to develop a novel approach for estimating dietary iron absorption using a population sample from a sub-section of the UK National Diet and Nutrition Survey (NDNS). Data were analyzed in 873 subjects from the 2000–2001 adult cohort of the NDNS, for whom both dietary intake data and hematological measures (hemoglobin and serum ferritin (SF) concentrations) were available. There were 495 men aged 19–64 y (mean age 42.7±12.1 y) and 378 pre-menopausal women (mean age 35.7±8.2 y). Individual dietary iron requirements were estimated using the Institute of Medicine calculations. A full probability approach was then applied to estimate the prevalence of dietary intakes that were insufficient to meet the needs of the men and women separately, based on their estimated daily iron intake and a series of absorption values ranging from 1–40%. The prevalence of SF concentrations below selected cut-off values (indicating that absorption was not high enough to maintain iron stores) was derived from individual SF concentrations. An estimate of the dietary iron absorption required to maintain specified SF values was then calculated by matching the observed prevalence of insufficiency with the prevalence predicted for the series of absorption estimates. Mean daily dietary iron intakes were 13.5 mg for men and 9.8 mg for women. Mean calculated dietary absorption was 8% in men (50th percentile for SF 85 µg/L) and 17% in women (50th percentile for SF 38 µg/L). At a ferritin level of 45 µg/L, estimated absorption was similar in men (14%) and women (13%). This new method can be used to calculate dietary iron absorption at a population level using data describing total iron intake and SF concentration.
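    To make the matching logic concrete, the sketch below shows a simplified version of the approach described above: for each candidate absorption fraction from 1% to 40%, it predicts the prevalence of insufficiency (absorbed iron below individual requirement) and picks the fraction whose prediction best matches an observed prevalence of low serum ferritin. It is a simplification of the full probability approach, and all inputs are invented placeholders rather than NDNS data.

```python
# Minimal sketch of the prevalence-matching idea; intakes, requirements
# and the observed prevalence below are invented placeholders.
import numpy as np

rng = np.random.default_rng(0)
intakes_mg = rng.normal(9.8, 2.5, size=500)        # hypothetical daily iron intakes (mg)
requirements_mg = rng.normal(1.5, 0.5, size=500)   # hypothetical absorbed-iron requirements (mg)
observed_prev_low_ferritin = 0.17                  # hypothetical prevalence of SF below cut-off

def estimate_absorption(intakes, requirements, observed_prev,
                        candidates=np.arange(0.01, 0.41, 0.01)):
    """Return the absorption fraction whose predicted prevalence of
    insufficiency best matches the observed prevalence."""
    best, best_gap = None, np.inf
    for a in candidates:
        predicted_prev = np.mean(intakes * a < requirements)
        gap = abs(predicted_prev - observed_prev)
        if gap < best_gap:
            best, best_gap = a, gap
    return best

print(estimate_absorption(intakes_mg, requirements_mg, observed_prev_low_ferritin))
```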

    Anticipated resource utilization for injury versus non-injury pediatric visits to emergency departments

    Background: Childhood injuries are increasingly treated in emergency departments (EDs), but the relationship between injury severity and ED resource utilization has not been evaluated. The objective of this study was to compare resource utilization for pediatric injury-related ED visits across injury-severity levels and with non-injury visits, using standardized, validated scales. Methods: A retrospective analysis of 2004-2008 ED visits from the Pediatric Emergency Care Applied Research Network Core Data Project. Maximum Abbreviated Injury Scale severity (MAIS) and Severity Classification System (SCS) scores were calculated and compared. MAIS and SCS are ordinal scales from 1 (minor injury) to 6, and 1 (low anticipated resource utilization) to 5, respectively. ED length of stay (LOS) and admission percentages were calculated as comparative proxy measures of resource utilization. Results: There were 763,733 injury visits and 2,328,916 non-injury visits, most with an SCS of 2 or 3. Of the injured patients, 59.2% had an MAIS of 1. ED LOS and admission percentage increased with increasing MAIS from 1 to 5. LOS and admission percentage increased with increasing SCS in both samples. Median LOS was shorter for injured versus non-injured patients with SCS 3-5. Non-injured patients with SCS 2-5 were more likely to be admitted than injured patients. Most injured patients had an SCS of 3 with an MAIS of 1-2, or an SCS of 2 with an MAIS of 1, with no correlation between the two scales. Conclusion: While admission rates and LOS increase with increasing AIS and SCS severity, these two classification schemas do not reliably correlate. Similarly, ED visit metrics differ between injured and non-injured patients in similar SCS categories. Although AIS and SCS both have value, these differences should be considered when using these schemas in research and quality improvement.
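    One simple way to quantify the (lack of) agreement between the two ordinal scales mentioned above is Spearman's rank correlation, sketched below with invented paired scores; this is an illustration only, not the study's analysis.

```python
# Minimal sketch: rank correlation between two ordinal severity scales;
# the paired scores are invented placeholders, not study data.
from scipy.stats import spearmanr

mais = [1, 1, 2, 1, 3, 2, 1, 4, 2, 1]   # hypothetical MAIS scores (scale 1-6)
scs = [3, 2, 3, 2, 3, 4, 2, 5, 3, 3]    # hypothetical SCS scores (scale 1-5)

rho, p_value = spearmanr(mais, scs)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")
```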

    Efficacy and Safety of Vancomycin Loading Doses in Critically Ill Patients with Methicillin-Resistant Staphylococcus aureus Infection

    Background: While vancomycin loading doses may facilitate earlier pharmacokinetic–pharmacodynamic target attainment, the impact of loading doses on clinical outcomes remains understudied. Critically ill patients are at the highest risk of morbidity and mortality from methicillin-resistant Staphylococcus aureus (MRSA) infection and are hypothesized to be the most likely to benefit from a loading dose. We sought to determine the association between receipt of a vancomycin loading dose and clinical outcomes in a cohort of critically ill adults. Methods: Four hundred and forty-nine critically ill patients with MRSA cultures isolated from blood or respiratory specimens were eligible for the study. Cohorts were established by receipt of a loading dose (⩾20 mg/kg actual body weight) or not. The primary outcome was clinical failure, a composite outcome of death within 30 days of first MRSA culture, blood cultures positive ⩾7 days, white blood cell count up to 5 days from vancomycin initiation, temperature up to 5 days from vancomycin initiation, or substitution (or addition) of another MRSA agent. Results: There was no difference in the percentage of patients experiencing clinical failure between the loading dose and no loading dose groups (74.8% versus 72.8%; p = 0.698). Secondary outcomes, including mortality and acute kidney injury, were also similar between groups, as was subgroup analysis based on site of infection. Exploratory analyses, including assessment of loading dose based on quartiles and a multivariable logistic regression model, showed no differences. Conclusion: Use of vancomycin loading doses was not associated with improved clinical outcomes in critically ill patients with MRSA infection.
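    The cohort definition above hinges on a single weight-based threshold; the sketch below shows how such a ⩾20 mg/kg (actual body weight) criterion could be applied in code. The function name and patient records are hypothetical and not part of the study.

```python
# Minimal sketch: classifying a first vancomycin dose as a loading dose
# using the >=20 mg/kg actual-body-weight threshold; data are invented.
def received_loading_dose(first_dose_mg: float, actual_body_weight_kg: float,
                          threshold_mg_per_kg: float = 20.0) -> bool:
    """True if the first dose meets or exceeds the loading-dose threshold."""
    return first_dose_mg / actual_body_weight_kg >= threshold_mg_per_kg

patients = [
    {"id": "A", "first_dose_mg": 2000, "weight_kg": 80},   # 25 mg/kg -> loading dose
    {"id": "B", "first_dose_mg": 1000, "weight_kg": 95},   # ~10.5 mg/kg -> no loading dose
]
for p in patients:
    cohort = "loading dose" if received_loading_dose(p["first_dose_mg"], p["weight_kg"]) else "no loading dose"
    print(p["id"], cohort)
```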

    Acceptability of policies to reduce consumption of red and processed meat: A population-based survey experiment

    Policies to reduce meat consumption are needed to help achieve climate change targets, and could also improve population health. Public acceptability can affect the likelihood of policy implementation. This study estimated the acceptability of policies to reduce red and processed meat consumption, and whether acceptability differed when policies were framed as benefitting health or the environment. In an online experiment, 2215 UK adults rated the acceptability of six policies, presented in a randomised order. Prior to rating the policies, participants were randomised to one of two framing conditions, with policy outcomes described either as benefitting health or as benefitting the environment. Regression models examined differences in the primary outcome – policy acceptability (rated on a 7-point scale) – by framing. Labels were the most accepted policy (48% support), followed by a media campaign (45%), reduced availability (40%) and providing incentives (38%). Increasing price (27%) and banning advertising (26%) were the least accepted. A substantial proportion of participants neither supported nor opposed most policies (26–33%), although this fell to 16% for increasing price. There was no evidence that framing policy benefits from a health or an environment perspective influenced acceptability (−0.06, 95% CI: −0.18, 0.07). Fewer than half of the UK sample expressed support for any of the six policies to reduce meat consumption, regardless of whether the measures were framed as benefitting health or the environment. Conversely, fewer than half expressed opposition, with the exception of increasing price, suggesting considerable scope to influence public opinion in support of meat reduction measures to meet environmental and health goals.
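    As a sketch of the kind of regression described above, the code below fits an ordinary least squares model of a 1-7 acceptability rating on a binary framing indicator; the data are simulated placeholders, not the survey responses, and the modelling choices are assumptions rather than the authors' specification.

```python
# Minimal sketch: acceptability rating regressed on framing condition;
# all data are simulated placeholders, not the study's survey responses.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
framing = rng.integers(0, 2, size=200)                             # 0 = health framing, 1 = environment framing
rating = np.clip(np.round(rng.normal(4.2, 1.5, size=200)), 1, 7)   # hypothetical 1-7 acceptability ratings

X = sm.add_constant(framing)
model = sm.OLS(rating, X).fit()
print(model.params)       # intercept and framing effect
print(model.conf_int())   # 95% confidence intervals
```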

    Autologous stem cell transplantation is an effective salvage therapy for primary refractory multiple myeloma

    High-dose therapy and autologous stem cell transplantation (ASCT) have proven efficacy in patients with multiple myeloma responding well to induction therapy. For those who fail to achieve a stable partial response (PR), the effect of ASCT is unclear. We report on 126 patients, identified from a national database, who underwent ASCT having achieved less than a PR after induction with modern induction regimens. The overall response rate was 86% (24% complete response). Patients with progressive disease at the time of transplantation had poorer outcomes than those with minimally responsive or stable disease, but clinical benefit was seen in all groups. Day 100 and 1-year nonrelapse mortalities were 2% and 4%, respectively. The 5-year relapse rate and progression-free survival were 84% and 14% (median, 18 months), respectively. The 5-year overall survival was 42% (median, 51 months). Our findings support the use of ASCT in myeloma patients responding suboptimally to modern induction therapies. Patients should not be excluded on the basis of refractoriness to induction, as ASCT is effective in this group, which is conventionally considered to have a poor outcome. Comprehensive multivariate analysis identified no disparate subgroups, indicating that ASCT is a reasonable strategy for all fit primary refractory patients.